Dimension Independent Matrix Square using MapReduce (DIMSUM)
Authors
Abstract
We compute the singular values of an m × n sparse matrix A in a distributed setting, without communication dependence on m, which is useful for very large m. In particular, we give a simple nonadaptive sampling scheme where the singular values of A are estimated within relative error ε with constant probability. Our proven bounds focus on the MapReduce framework, which has become the de facto tool for handling such large matrices that cannot be stored or even streamed through a single machine. Along the way, we give a general method to compute AᵀA. We preserve the singular values of AᵀA with relative error ε with shuffle size O(n²/ε²) and reduce-key complexity O(n/ε²). We further show that if only specific entries of AᵀA are required and A has nonnegative entries, then we can reduce the shuffle size to O(n log(n)/s) and the reduce-key complexity to O(log(n)/s), where s is the minimum cosine similarity for the entries being estimated. All of our bounds are independent of m, the larger dimension. We provide open-source implementations in Spark and Scalding, along with experiments in an industrial setting.
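To make the oversampling idea concrete, the following is a minimal single-machine sketch, not the paper's exact MapReduce algorithm: each cross term a_ij·a_ik of AᵀA is kept with probability min(1, γ/(‖c_j‖‖c_k‖)) and reweighted by the inverse probability, so the estimate is unbiased while the expected work per column pair is bounded by the oversampling parameter γ. The function name and the γ parameter are illustrative.

```python
import math
import random
from collections import defaultdict

def dimsum_sketch(rows, n, gamma, seed=0):
    """Estimate B = A^T A by keeping each cross term a_ij * a_ik with
    probability p = min(1, gamma / (|c_j| * |c_k|)) and reweighting by 1/p
    (importance sampling, so the estimate is unbiased in expectation).
    `rows` is a list of dicts {column index: value} of nonzero entries.
    """
    rng = random.Random(seed)
    # Column norms |c_j|, assumed broadcast to every mapper.
    norms = [0.0] * n
    for row in rows:
        for j, v in row.items():
            norms[j] += v * v
    norms = [math.sqrt(x) for x in norms]

    # "Map" phase: emit weighted cross terms per row; "reduce" phase: sum them.
    b = defaultdict(float)
    for row in rows:
        for j in row:
            for k in row:
                p = min(1.0, gamma / (norms[j] * norms[k]))
                if rng.random() < p:
                    b[(j, k)] += row[j] * row[k] / p
    return b
```

Setting γ at least as large as the largest norm product makes every keep-probability 1, in which case the sketch returns AᵀA exactly; smaller γ trades accuracy for a smaller shuffle.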
Similar resources
Dimension independent similarity computation
We present a suite of algorithms for Dimension Independent Similarity Computation (DISCO) to compute all pairwise similarities between very high-dimensional sparse vectors. All of our results are provably independent of dimension, meaning that apart from the initial cost of trivially reading in the data, all subsequent operations are independent of the dimension; thus the dimension can be very ...
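A small illustration of why sparse all-pairs similarity can avoid dimension-dependent work: with an inverted index over dimensions, each dimension contributes only to the pairs of vectors that are both nonzero there. This is a generic single-machine sketch, not the DISCO algorithms themselves; all names are illustrative.

```python
from collections import defaultdict
from math import sqrt

def cosine_similarities(vectors):
    """All-pairs cosine similarities for sparse vectors given as
    {vector id: {dimension: value}}. Work scales with the number of
    co-occurring nonzeros, not with the ambient dimension."""
    norms = {k: sqrt(sum(v * v for v in vec.values()))
             for k, vec in vectors.items()}
    index = defaultdict(list)              # dimension -> [(vector id, value)]
    for k, vec in vectors.items():
        for d, v in vec.items():
            index[d].append((k, v))
    dots = defaultdict(float)
    for entries in index.values():
        for i in range(len(entries)):
            for j in range(i + 1, len(entries)):
                (a, va), (b, vb) = entries[i], entries[j]
                dots[(a, b)] += va * vb
    return {pair: dot / (norms[pair[0]] * norms[pair[1]])
            for pair, dot in dots.items()}
```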
Comparison of Conjugate Gradient Method and Jacobi Method Algorithm on MapReduce Framework
As the volume of data continues to grow across many areas of science, parallel computing is a solution to the scaling problem many applications face. The goal of a parallel program is to enable the execution of larger problems and to reduce the execution time compared to sequential programs. Among parallel computing frameworks, MapReduce is a framework that enables parallel processing of data o...
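For reference, here is a minimal single-machine version of the Jacobi method named above; it is not taken from that paper. Each component update reads only the previous iterate, which is what makes the method easy to parallelize (e.g. one map task per row in a MapReduce setting).

```python
def jacobi(A, b, iters=50):
    """Jacobi iteration for A x = b:
        x_i <- (b_i - sum_{j != i} a_ij * x_j) / a_ii.
    Converges when A is strictly diagonally dominant. Every component
    of the new iterate depends only on the old iterate, so all n
    updates in a sweep can run in parallel."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x
```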
MapReduce-based Dimensional ETL Made Easy
This paper demonstrates ETLMR, a novel dimensional Extract-Transform-Load (ETL) programming framework that uses MapReduce to achieve scalability. ETLMR has built-in native support for data warehouse (DW) specific constructs such as star schemas, snowflake schemas, and slowly changing dimensions (SCDs). This makes it possible to build MapReduce-based dimensional ETL flows very easily. The ETL pr...
A Projected Alternating Least Square Approach for Computation of Nonnegative Matrix Factorization
Nonnegative matrix factorization (NMF) is a common method in data mining that has been used in different applications as a dimension-reduction, classification, or clustering method. Methods in the alternating least squares (ALS) approach are usually used to solve this non-convex minimization problem. At each step of ALS algorithms, two convex least-squares problems must be solved, which causes high com...
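A minimal NumPy sketch of the projected-ALS idea, assuming a dense input matrix: each half-step solves an unconstrained least-squares problem in closed form and then projects negative entries to zero. This is a generic illustration of the approach, not the specific algorithm of that paper; the function name and defaults are illustrative.

```python
import numpy as np

def nmf_als(A, r, iters=20, seed=0):
    """Projected alternating least squares for A ~ W @ H with W, H >= 0.
    Each step solves an unconstrained least-squares problem, then clips
    negative entries to zero (the 'projection' onto the feasible set)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(iters):
        # Solve min_H ||A - W H||_F, then project onto the nonnegative orthant.
        H = np.maximum(np.linalg.lstsq(W, A, rcond=None)[0], 0.0)
        # Solve min_W ||A - W H||_F via the transposed problem, then project.
        W = np.maximum(np.linalg.lstsq(H.T, A.T, rcond=None)[0].T, 0.0)
    return W, H
```

On a rank-r nonnegative input, this recovers the matrix essentially exactly; on general data each half-step is cheap but, as the abstract notes, solving two least-squares problems per iteration is the dominant cost.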